Radius Margin Bounds for Support Vector . . .

Authors

  • Kai-Min Chung
  • Wei-Chun Kao
  • Chia-Liang Sun
  • Li-Lun Wang
  • Chih-Jen Lin
Abstract


Similar resources

Tuning L1-SVM Hyperparameters with Modified Radius Margin Bounds and Simulated Annealing

In the design of support vector machines, an important step is selecting the optimal hyperparameters. One of the most widely used performance estimators is the radius-margin bound. Some modifications of this bound have been made to adapt it to soft-margin problems, giving a convex optimization problem for the L2 soft-margin formulation. However, it is still interesting to consider the L1 case du...
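The radius-margin criterion described above can be sketched numerically. The following is an illustrative Python sketch, assuming scikit-learn; the synthetic data, the parameter grid, and the `radius_margin_score` helper are my own illustration, not code from any of the papers listed here. For a hard-margin SVM the leave-one-out error is bounded by R²‖w‖², and for an RBF kernel K(x, x) = 1, so every point lies on the unit sphere in feature space and R² ≤ 1 is a cheap upper bound; the score is used here heuristically to rank soft-margin models.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel

def radius_margin_score(X, y, C, gamma):
    """Heuristic radius-margin style score R^2 * ||w||^2 (smaller is better)."""
    clf = SVC(C=C, kernel="rbf", gamma=gamma).fit(X, y)
    # ||w||^2 = sum_ij alpha_i alpha_j y_i y_j K(x_i, x_j);
    # SVC stores dual_coef_ = alpha_i * y_i for the support vectors.
    sv = clf.support_vectors_
    dual = clf.dual_coef_.ravel()
    K = rbf_kernel(sv, sv, gamma=gamma)
    w_norm_sq = float(dual @ K @ dual)
    # RBF kernel: K(x, x) = 1, so R^2 <= 1 upper-bounds the enclosing
    # sphere's squared radius in feature space.
    R_sq = 1.0
    return R_sq * w_norm_sq

# Illustrative synthetic data and grid.
X, y = make_classification(n_samples=100, random_state=0)
scores = {(C, g): radius_margin_score(X, y, C, g)
          for C in (0.1, 1, 10) for g in (0.01, 0.1, 1)}
best = min(scores, key=scores.get)
```

A tighter treatment would also compute R by solving the minimum enclosing ball problem in feature space, as the original bound requires.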


Radius-Margin Bound on the Leave-One-Out Error of the LLW-M-SVM

To set the values of the hyperparameters of a support vector machine (SVM), one can use cross-validation. Its leave-one-out variant produces an almost unbiased estimator of the generalization error. Its major drawback lies in its time requirement. To overcome this difficulty, several upper bounds on the leave-one-out error of the pattern recognition SVM have been derived. The most pop...


A Quadratic Loss Multi-Class SVM for which a Radius-Margin Bound Applies

To set the values of the hyperparameters of a support vector machine (SVM), the method of choice is cross-validation. Several upper bounds on the leave-one-out error of the pattern recognition SVM have been derived. One of the most popular is the radius–margin bound. It applies to the hard margin machine, and, by extension, to the 2-norm SVM. In this article, we introduce the first quadratic lo...


Radius-Margin Bound on the Leave-One-Out Error of an M-SVM

Using a support vector machine (SVM) requires setting the values of two types of hyperparameters: the soft margin parameter C and the parameters of the kernel. To perform this model selection task, the method of choice is cross-validation. Its leave-one-out variant is known to produce an almost unbiased estimator of the generalization error. Its major drawback lies in its time requirem...
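The leave-one-out cost that motivates these bounds is easy to see in code: every candidate (C, kernel parameter) pair requires as many SVM trainings as there are examples. A minimal sketch, assuming scikit-learn; the data and the grid are illustrative placeholders:

```python
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.datasets import make_classification

# Hypothetical small problem; real data would replace make_classification.
X, y = make_classification(n_samples=40, random_state=0)

def loo_error(C, gamma):
    """Leave-one-out error estimate: one SVM fit per held-out example."""
    clf = SVC(C=C, kernel="rbf", gamma=gamma)
    acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
    return 1.0 - acc

# Grid over the two hyperparameter types: C and the kernel parameter.
grid = {(C, g): loo_error(C, g) for C in (0.1, 1, 10) for g in (0.01, 0.1)}
best_C, best_gamma = min(grid, key=grid.get)
```

Even this tiny grid performs 40 fits per candidate; replacing `loo_error` with a cheap upper bound such as the radius-margin bound removes that inner loop entirely, which is the point of the bounds discussed above.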


Multi-objective Model Selection for Support Vector Machines

In this article, model selection for support vector machines is viewed as a multi-objective optimization problem, where model complexity and training accuracy define two conflicting objectives. Different optimization criteria are evaluated: Split modified radius margin bounds, which allow for comparing existing model selection criteria, and the training error in conjunction with the number of s...
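The two-objective view above can be sketched as a Pareto filter over a candidate grid. This is an illustrative Python sketch assuming scikit-learn; the data, the grid, and the use of the support-vector count as a complexity proxy are my own assumptions, not the paper's exact criteria:

```python
from sklearn.svm import SVC
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=80, random_state=0)

def objectives(C, gamma):
    """Two conflicting objectives: training error and model complexity
    (proxied here by the number of support vectors)."""
    clf = SVC(C=C, kernel="rbf", gamma=gamma).fit(X, y)
    train_err = 1.0 - clf.score(X, y)
    complexity = int(clf.n_support_.sum())
    return (train_err, complexity)

cands = [(C, g) for C in (0.1, 1, 10, 100) for g in (0.01, 0.1, 1)]
objs = {c: objectives(*c) for c in cands}

def dominates(p, q):
    # p dominates q if p is no worse in both objectives and not equal.
    return all(pi <= qi for pi, qi in zip(p, q)) and p != q

# Keep only Pareto-optimal candidates: no other candidate dominates them.
pareto = [c for c in cands
          if not any(dominates(objs[d], objs[c]) for d in cands)]
```

Instead of collapsing the two criteria into one number, the Pareto set exposes the trade-off and leaves the final choice to the practitioner, which is the multi-objective framing of model selection.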



Journal:

Publication year: 2003